On Characterization of Entropy Function via Information Inequalities

Authors

  • Zhen Zhang
  • Raymond W. Yeung
Abstract

Given $n$ discrete random variables $\Omega = \{X_1, \ldots, X_n\}$, associated with any subset $\alpha$ of $\{1, 2, \ldots, n\}$ there is a joint entropy $H(X_\alpha)$, where $X_\alpha = \{X_i : i \in \alpha\}$. This can be viewed as a function defined on $2^{\{1, 2, \ldots, n\}}$ taking values in $[0, +\infty)$. We call this function the entropy function of $\Omega$. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that this function is nondecreasing; and the nonnegativity of the conditional mutual informations implies that this function has the following property: for any two subsets $\alpha$ and $\beta$ of $\{1, 2, \ldots, n\}$, $H(\alpha) + H(\beta) \geq H(\alpha \cup \beta) + H(\alpha \cap \beta)$. These properties are the so-called basic information inequalities of Shannon's information measures. Do these properties fully characterize the entropy function? To make this question more precise, we view an entropy function as a $(2^n - 1)$-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set $\{1, 2, \ldots, n\}$. Let $\Gamma_n$ be the cone in $\mathbb{R}^{2^n - 1}$ consisting of all vectors which have these three properties when they are viewed as functions defined on $2^{\{1, 2, \ldots, n\}}$. Let $\Gamma_n^*$ be the set of all $(2^n - 1)$-dimensional vectors which correspond to the entropy functions of some set of $n$ discrete random variables. The question can be restated as: is it true that for any $n$, $\overline{\Gamma_n^*} = \Gamma_n$? Here $\overline{\Gamma_n^*}$ stands for the closure of the set $\Gamma_n^*$. The answer is "yes" when $n = 2$ and $3$, as proved in our previous work. Based on intuition, one may tend to believe that the answer should be "yes" for any $n$. The main discovery of this paper is a new information-theoretic inequality involving four discrete random variables which gives a negative answer to this fundamental problem in information theory: $\overline{\Gamma_n^*}$ is strictly smaller than $\Gamma_n$ whenever $n > 3$. While this new inequality gives a nontrivial outer bound to the cone $\overline{\Gamma_4^*}$, an inner bound for $\overline{\Gamma_4^*}$ is also given. The inequality is also extended to any number of random variables.
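
As a concrete illustration of the objects in the abstract, the following minimal Python sketch (not code from the paper; names such as entropy_vector and holds_basic are hypothetical) builds the entropy function of four binary random variables as a vector of joint entropies indexed by the nonempty subsets, checks the three basic inequalities, and evaluates the paper's four-variable inequality in the form in which it is commonly cited, $2I(X_3;X_4) \le I(X_1;X_2) + I(X_1;X_3X_4) + 3I(X_3;X_4|X_1) + I(X_3;X_4|X_2)$. Since every genuine entropy function satisfies both the basic inequalities and the new one, a random distribution can only illustrate the construction, not the separation $\overline{\Gamma_4^*} \subsetneq \Gamma_4$.

```python
# Minimal sketch (assumes numpy): the entropy function of n discrete random
# variables as a map from nonempty subsets of {0, ..., n-1} to joint entropies.
from itertools import combinations
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zeros are skipped)."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def entropy_vector(pmf):
    """Joint entropy H(X_alpha) for every nonempty subset alpha.
    pmf is an n-dimensional array whose i-th axis indexes X_{i+1}."""
    n = pmf.ndim
    h = {}
    for r in range(1, n + 1):
        for alpha in combinations(range(n), r):
            other = tuple(i for i in range(n) if i not in alpha)
            h[frozenset(alpha)] = entropy(pmf.sum(axis=other))
    return h

def holds_basic(h, n, tol=1e-9):
    """Check the three basic inequalities: nonnegativity, monotonicity,
    and submodularity H(a) + H(b) >= H(a union b) + H(a intersect b)."""
    subsets = [frozenset(s) for r in range(1, n + 1)
               for s in combinations(range(n), r)]
    H = lambda s: h[s] if s else 0.0   # H of the empty set is 0
    for a in subsets:
        if H(a) < -tol:                            # nonnegative
            return False
        for b in subsets:
            if a <= b and H(a) > H(b) + tol:       # nondecreasing
                return False
            if H(a) + H(b) + tol < H(a | b) + H(a & b):  # submodular
                return False
    return True

# A random joint distribution over four binary variables X_1, ..., X_4.
rng = np.random.default_rng(0)
pmf = rng.random((2, 2, 2, 2))
pmf /= pmf.sum()

h = entropy_vector(pmf)
H = lambda *idx: h[frozenset(idx)]     # H(0, 1) means H(X_1, X_2)
print("basic inequalities hold:", holds_basic(h, 4))

# The four-variable inequality, expanded into joint entropies with
# A = X_1, B = X_2, C = X_3, D = X_4.
I_CD   = H(2) + H(3) - H(2, 3)                     # I(C;D)
I_AB   = H(0) + H(1) - H(0, 1)                     # I(A;B)
I_ACD  = H(0) + H(2, 3) - H(0, 2, 3)               # I(A;CD)
I_CD_A = H(0, 2) + H(0, 3) - H(0, 2, 3) - H(0)     # I(C;D|A)
I_CD_B = H(1, 2) + H(1, 3) - H(1, 2, 3) - H(1)     # I(C;D|B)
print("new inequality holds:",
      2 * I_CD <= I_AB + I_ACD + 3 * I_CD_A + I_CD_B + 1e-9)
```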


Similar Articles

A note on inequalities for Tsallis relative operator entropy

In this short note, we present some inequalities for relative operator entropy which are generalizations of some results obtained by Zou [Operator inequalities associated with Tsallis relative operator entropy, Math. Inequal. Appl. 18 (2015), no. 2, 401-406]. Meanwhile, we also show some new lower and upper bounds for relative operator entropy and Tsallis relative o...


SHAPLEY FUNCTION BASED INTERVAL-VALUED INTUITIONISTIC FUZZY VIKOR TECHNIQUE FOR CORRELATIVE MULTI-CRITERIA DECISION MAKING PROBLEMS

Interval-valued intuitionistic fuzzy set (IVIFS) theory has been developed to cope with the uncertainty of imprecise human thinking. In the present communication, new entropy and similarity measures for IVIFSs based on the exponential function are presented and compared with the existing measures. Numerical results reveal that the proposed information measures attain a higher association with the existing me...


Facets of Entropy

Constraints on the entropy function are of fundamental importance in information theory. For a long time, the polymatroidal axioms, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities that are implied by the nonnegativity of the Shannon information measures are categorically referred to as Shannon-type inequalities. If the number of ran...
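
For reference, the polymatroidal axioms mentioned above are the standard ones (stated here from general knowledge, since the summary is truncated): a set function $f$ on the subsets of $\{1, \ldots, n\}$ is polymatroidal if

```latex
f(\emptyset) = 0, \qquad
f(\alpha) \le f(\beta) \quad \text{whenever } \alpha \subseteq \beta, \qquad
f(\alpha) + f(\beta) \ge f(\alpha \cup \beta) + f(\alpha \cap \beta),
```

and an inequality is Shannon-type precisely when it is implied by these axioms.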


Inequalities of Ando's Type for $n$-convex Functions

By utilizing different scalar equalities obtained via Hermite's interpolating polynomial, we obtain lower and upper bounds for the difference in Ando's inequality and in the Edmundson-Lah-Ribarič inequality for solidarities that hold for a class of $n$-convex functions. As an application, the main results are applied to some operator means and relative operator entropy.


Generating Mathematical Inequalities via Fuzzy Information Measures

One of the important application areas of information-theoretic measures is the development of inequalities frequently used in information theory. The present communication deals with the development of such inequalities through the maximization of entropy measures, especially when dealing with fuzzy distributions.



Journal:
  • IEEE Trans. Information Theory

Volume 44, Issue

Pages -

Publication date: 1998